Explanation Consistency Training: Facilitating Consistency-Based Semi-Supervised Learning with Interpretability
Authors
Abstract
Unlabeled data exploitation and interpretability are usually both required in reality. They are, however, conducted independently, and very few works try to connect the two. For unlabeled data exploitation, state-of-the-art semi-supervised learning (SSL) results have been achieved by encouraging consistency of model output under perturbation, i.e., the consistency assumption. However, it remains hard for users to understand how particular decisions are made by SSL models. To this end, in this paper we first disclose that the consistency assumption is closely related to causal invariance, where invariance is the main reason why the assumption is valid. We then propose ECT (Explanation Consistency Training), which encourages a consistent decision under perturbation for a consistent reason. ECT employs model explanation as a surrogate of the causal factors behind the output, which is able to bridge state-of-the-art SSL models and alleviate the high complexity of modeling causality. We realize ECT as ECT-SM for vision and ECT-ATT for NLP tasks. Experimental results on real-world data sets validate the highly competitive performance and better explanations of the proposed algorithms.
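The abstract's core idea — penalizing disagreement in both the model's output and its explanation under perturbation — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `saliency` stands in for a model explanation via finite-difference input gradients, and the names `ect_loss`, `perturb`, and `lam` are hypothetical.

```python
import numpy as np

def saliency(f, x, eps=1e-4):
    """Finite-difference input gradient of a scalar model output f(x),
    used here as a simple stand-in for a model explanation."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def ect_loss(f, x, perturb, lam=1.0):
    """Penalize both output inconsistency and explanation inconsistency
    between an input and its perturbed version."""
    x_p = perturb(x)
    out_cons = (f(x) - f(x_p)) ** 2
    exp_cons = float(np.mean((saliency(f, x) - saliency(f, x_p)) ** 2))
    return out_cons + lam * exp_cons
```

With an identity perturbation the loss is zero; a real perturbation (noise, augmentation) makes both terms active, and `lam` trades off output versus explanation consistency.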
Similar Articles
Weight-averaged Consistency Targets Improve Semi-supervised Deep Learning Results
The recently proposed temporal ensembling has achieved state-of-the-art results in several semi-supervised learning benchmarks. It maintains an exponential moving average of label predictions on each training example, and penalizes predictions that are inconsistent with this target. However, because the targets change only once per epoch, temporal ensembling becomes unwieldy when using large da...
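The mechanism described above — an exponential moving average of per-example predictions, with targets updated once per epoch — can be sketched as follows. This is a simplified illustration of the temporal-ensembling recipe; the function names are hypothetical.

```python
import numpy as np

def update_targets(ema_preds, preds, epoch, alpha=0.6):
    """Accumulate an EMA of per-example predictions (once per epoch)
    and correct the startup bias from initializing the EMA at zero."""
    ema_preds = alpha * ema_preds + (1 - alpha) * preds
    targets = ema_preds / (1 - alpha ** (epoch + 1))  # bias correction
    return ema_preds, targets

def consistency_penalty(preds, targets):
    """Mean squared error between current predictions and EMA targets."""
    return float(np.mean((preds - targets) ** 2))
```

Because `targets` only moves when `update_targets` is called, using it per epoch is exactly the bottleneck the abstract mentions: on large datasets, targets incorporate new information very slowly.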
Training SpamAssassin with Active Semi-supervised Learning
Most spam filters include some automatic pattern classifiers based on machine learning and pattern recognition techniques. Such classifiers often require a large training set of labeled emails to attain a good discriminant capability between spam and legitimate emails. In addition, they must be frequently updated because of the changes introduced by spammers to their emails to evade spam filter...
Combining Active Learning and Semi-supervised Learning Using Local and Global Consistency
Semi-supervised learning and active learning are important techniques to solve the shortage of labeled examples. In this paper, a novel active learning algorithm combining semi-supervised Learning with Local and Global Consistency (LLGC) is proposed. It selects the example that can minimize the estimated expected classification risk for labeling. Then, a better classifier can be trained with la...
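The selection rule described above — query the unlabeled example that minimizes the estimated expected classification risk — can be sketched generically. This is a schematic illustration, not the LLGC algorithm itself: `select_query` and `expected_risk` are hypothetical names, and how the post-labeling risk is estimated is left abstract.

```python
import numpy as np

def select_query(pool_idx, proba, expected_risk):
    """Pick the unlabeled example whose labeling minimizes the expected
    classification risk, averaging over its possible labels weighted by
    the model's current class probabilities."""
    best, best_risk = None, np.inf
    for i in pool_idx:
        # E[risk | query i] = sum_y P(y | x_i) * risk(model retrained with (x_i, y))
        r = sum(proba[i, y] * expected_risk(i, y) for y in range(proba.shape[1]))
        if r < best_risk:
            best, best_risk = i, r
    return best
```

The cost lies in `expected_risk`, which in principle requires re-running the semi-supervised learner once per candidate and label; practical versions approximate this.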
Semi-Supervised Learning Based Prediction of Musculoskeletal Disorder Risk
This study explores a semi-supervised classification approach using random forest as a base classifier to classify the risk of low-back disorders (LBDs) associated with industrial jobs. The semi-supervised classification approach uses unlabeled data together with a small number of labeled examples to create a better classifier. The results obtained by the proposed approach are compared with those o...
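The general idea of using unlabeled data alongside a small labeled set can be illustrated with a self-training (pseudo-labeling) loop, one common family of semi-supervised classification. This is a generic sketch, not the study's method: the base classifier is abstracted into `fit`/`predict_proba` callables, and `self_train`, `rounds`, and `thresh` are hypothetical names.

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, fit, predict_proba, rounds=5, thresh=0.9):
    """Fit on labeled data, pseudo-label the unlabeled examples the model
    is confident about, add them to the training set, and refit."""
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    for _ in range(rounds):
        model = fit(X, y)
        if len(pool) == 0:
            break
        proba = predict_proba(model, pool)
        keep = proba.max(axis=1) >= thresh          # confident predictions only
        if not keep.any():
            break
        X = np.vstack([X, pool[keep]])
        y = np.concatenate([y, proba[keep].argmax(axis=1)])  # pseudo-labels
        pool = pool[~keep]
    return fit(X, y)
```

Any probabilistic base classifier (such as the random forest used in the study) can be plugged in through the two callables.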
Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
The recently proposed Temporal Ensembling has achieved state-of-the-art results in several semi-supervised learning benchmarks. It maintains an exponential moving average of label predictions on each training example, and penalizes predictions that are inconsistent with this target. However, because the targets change only once per epoch, Temporal Ensembling becomes unwieldy when learning large...
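The weight-averaging fix this line of work proposes (a "mean teacher" whose parameters track an exponential moving average of the student's) reduces to a one-line update applied after every training step rather than once per epoch. A minimal sketch with hypothetical names:

```python
import numpy as np

def ema_update(teacher, student, decay=0.99):
    """Move each teacher parameter toward the student's after every
    training step, so consistency targets improve continuously."""
    return {k: decay * t + (1 - decay) * student[k] for k, t in teacher.items()}
```

Averaging weights instead of predictions removes the need to store per-example prediction histories, which is what makes the approach scale to large datasets.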
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2021
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v35i9.16934